Internal Representations by Error Propagation , " In
Authors
Abstract
… at different places along the dendrites). [Hasselmo et al.] show that acetylcholine affects the synaptic connection strength of lateral but not feedforward connections in rat piriform cortex. Neuromodulators such as this could provide a means for neurons to give more or less weighting to the "teach" inputs. The neuromodulators, or the teach inputs themselves, could serve as a gate for learning, ensuring that weight modification only occurs during times of appropriate teaching. It should also be noted that occasional misgating would not be disastrous. The teaching input could be considered as input from other sensory modalities (or sub-modalities, as in colour helping shape), possibly through the limbic system. Some of these modalities would be receiving nearly constant input and thus could serve as a teacher for others receiving changing stimuli. There is evidence that proprioceptive input is necessary for the development of visually guided reaching [Hein, 1974] and depth perception [Graves et al., 1987]. Alternatively, teaching input could come from temporally contiguous views. Miyashita [Miyashita, 1988] found neurons in macaque inferotemporal cortex that associated "visually distinct but temporally related" images; that article suggests that the strong limbic connections are important in this association. Kohonen mapping algorithms have been shown to account well for the retinotopic maps observed in striate cortex [Obermayer et al., 1990]. It is possible that the addition of the teach input with recurrent connections may explain how higher areas achieve position-invariant responses and object-centered representations [Rolls, 1990]. This theory would also explain the purpose of the prevalent back-projections throughout cortex. [Eckhorn et al., 1988] found that lesioning V2 had little effect on the response properties of cells in V1, indicating that back-projections, or at least the V2-V1 projection, have little purpose during regular processing. In this report we have provided some support for the hypothesis that these back-projections are important for the organization of the incoming sensory stimuli during learning.

Acknowledgements: We would like to thank Steven Nowlan and Leonidas Kontothanassis for their helpful comments and suggestions on earlier drafts.

4 Conclusions: We have seen that the addition of a teach input with recurrent projections greatly increases the uses of the competitive learning algorithm and may make it suitable for such tasks as position-invariant recognition. Although the algorithm has only been demonstrated on small problems, it was not specialized in any way for the above tasks. In fact, given enough hidden-layer …
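As a rough illustration of the idea described in the abstract, the sketch below (Python/NumPy) shows a winner-take-all competitive learning step in which an auxiliary "teach" signal both biases the competition and gates the weight update. The gating rule, the per-unit bias vector, the learning rate, and the layer sizes are all illustrative assumptions; this is not the authors' exact algorithm, and the recurrent back-projections are reduced here to a simple additive bias.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_units = 16, 4
W = rng.random((n_units, n_inputs))
W /= W.sum(axis=1, keepdims=True)          # normalized competitive weights
lr = 0.05

def competitive_step(x, teach_gate=1.0, teach_bias=None):
    """One winner-take-all update.

    teach_gate : scalar in [0, 1]; 0 suppresses learning (a misgated trial),
                 1 allows the full weight update.
    teach_bias : optional per-unit bias from the 'teach' input (standing in
                 for back-projections from another modality) that nudges the
                 competition toward a desired winner.
    """
    activation = W @ x
    if teach_bias is not None:
        activation = activation + teach_bias
    winner = int(np.argmax(activation))
    # Move only the winner's weights toward the (normalized) input,
    # scaled by the teaching gate.
    W[winner] += teach_gate * lr * (x / max(x.sum(), 1e-12) - W[winner])
    return winner

# Example: the teach input favours unit 2 while learning is gated on.
x = rng.random(n_inputs)
bias = np.zeros(n_units); bias[2] = 0.5
print(competitive_step(x, teach_gate=1.0, teach_bias=bias))
```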
Similar Resources
Prospective Hardware Implementation of the CHIR Neural Network Algorithm
I review the recently developed Choice of Internal Representations (CHIR) training algorithm for multi-layer perceptrons, with an emphasis on properties relevant to hardware implementation. A comparison to the common error back-propagation algorithm shows that there are potential advantages in realizing CHIR in hardware.
Semi-Supervised Affective Meaning Lexicon Expansion Using Semantic and Distributed Word Representations
In this paper, we propose an extension to graph-based sentiment lexicon induction methods by incorporating distributed and semantic word representations in building the similarity graph used to expand a three-dimensional sentiment lexicon. We also implemented and evaluated label propagation using four different word representations and similarity metrics. Our comprehensive evaluation of the four ...
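For readers unfamiliar with the approach, the following is a minimal, self-contained sketch of graph-based label propagation over a word-similarity graph built from distributed representations, in the spirit of the method summarized above. The word list, the random placeholder embeddings, the similarity transform, and the seed lexicon are assumptions for illustration, not the paper's data or exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder embeddings standing in for distributed word representations.
words = ["good", "great", "nice", "bad", "awful", "poor"]
E = rng.normal(size=(len(words), 50))
E /= np.linalg.norm(E, axis=1, keepdims=True)

# Similarity graph: cosine similarity rescaled to (0, 1], self-loops removed.
S = (E @ E.T + 1.0) / 2.0
np.fill_diagonal(S, 0.0)
P = S / S.sum(axis=1, keepdims=True)        # row-normalized transition matrix

# Seed lexicon scores in [-1, 1]; unlabeled words start at 0.
seeds = {"good": 1.0, "bad": -1.0}
y = np.array([seeds.get(w, 0.0) for w in words])
clamp = np.array([w in seeds for w in words])

f = y.copy()
for _ in range(100):                        # iterative label propagation
    f = P @ f
    f[clamp] = y[clamp]                     # keep seed labels fixed

print(dict(zip(words, np.round(f, 3))))
```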
Learning by Choice of Internal Representations: An Energy Minimization Approach
Learning by choice of internal representations (CHIR) is a learning algorithm for a multilayer neural network system, suggested by Grossman et al. [1,2] and based upon determining the internal representations of the system as well as its internal weights. In this paper, we propose an energy minimization approach whereby the internal representations (IR) as well as the weight matrix are allowed ...
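The snippet above does not spell out the energy function, so the sketch below only illustrates the general flavour: binary internal representations for each training example are flipped by coordinate descent whenever that does not increase a mismatch-count energy, while the weights are nudged toward the current representations with perceptron-style updates. The network size, the XOR task, the energy definition, and the update schedule are assumptions, not the CHIR rules of Grossman et al.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 2-2-1 network with +/-1 units learning XOR; the hidden "internal
# representations" (IR) are optimized directly, alongside the weights.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
T = np.array([-1, 1, 1, -1])

W1 = rng.normal(size=(2, 3))                 # hidden weights (incl. bias)
W2 = rng.normal(size=(3,))                   # output weights (incl. bias)
IR = np.sign(rng.normal(size=(4, 2)))        # one binary IR per example

def energy(W1, W2, IR):
    """Mismatches between IR and the hidden response, plus output errors."""
    Xb = np.hstack([X, np.ones((4, 1))])
    hidden_err = np.sum(np.sign(Xb @ W1.T) != IR)
    IRb = np.hstack([IR, np.ones((4, 1))])
    out_err = np.sum(np.sign(IRb @ W2) != T)
    return hidden_err + out_err

for _ in range(200):
    # Coordinate descent over IR bits: keep a flip if it does not raise energy.
    i, j = rng.integers(4), rng.integers(2)
    trial = IR.copy(); trial[i, j] *= -1
    if energy(W1, W2, trial) <= energy(W1, W2, IR):
        IR = trial
    # Perceptron-style weight updates toward the current IR and the targets.
    Xb = np.hstack([X, np.ones((4, 1))])
    W1 += 0.1 * ((IR - np.sign(Xb @ W1.T)).T @ Xb)
    IRb = np.hstack([IR, np.ones((4, 1))])
    W2 += 0.1 * ((T - np.sign(IRb @ W2)) @ IRb)

print("final energy:", energy(W1, W2, IR))
```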
Diffusion Tensor Representations and Their Applications to DTI Error Propagation
INTRODUCTION: The diffusion tensor is a 3x3 positive definite matrix and, therefore, possesses several distinct matrix decompositions, e.g., the Cholesky and the eigenvalue decompositions. To date, the eigenvalue decomposition has been used only in computing tensor-derived quantities [1-4] but not as a parametrization (or, equivalently, a representation) in DTI error propagation [5]. Treating a m...
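The two decompositions mentioned above are easy to demonstrate on a toy tensor. In the sketch below, the tensor values and the derived quantities shown (mean diffusivity and fractional anisotropy) are illustrative assumptions, not data or formulas taken from the paper.

```python
import numpy as np

# A toy symmetric positive-definite 3x3 "diffusion tensor" (values assumed).
D = np.array([[1.7, 0.2, 0.1],
              [0.2, 0.5, 0.05],
              [0.1, 0.05, 0.4]])

# Cholesky parametrization: D = L @ L.T with L lower-triangular.
L = np.linalg.cholesky(D)

# Eigenvalue decomposition: D = V @ diag(w) @ V.T.
w, V = np.linalg.eigh(D)

# Common tensor-derived quantities: mean diffusivity and fractional anisotropy.
md = w.mean()
fa = np.sqrt(1.5 * np.sum((w - md) ** 2) / np.sum(w ** 2))

print(np.allclose(L @ L.T, D), np.allclose(V @ np.diag(w) @ V.T, D))
print("MD =", round(md, 3), "FA =", round(fa, 3))
```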
Estimation of pull-in instability voltage of Euler-Bernoulli micro beam by back propagation artificial neural network
The static pull-in instability of beam-type micro-electromechanical systems is theoretically investigated. Two engineering cases including cantilever and double cantilever micro-beam are considered. Considering the mid-plane stretching as the source of the nonlinearity in the beam behavior, a nonlinear size-dependent Euler-Bernoulli beam model is used based on a modified couple stress theory, c...
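The truncated snippet stops before the neural-network part, so the following is only a generic sketch of the estimation step suggested by the title: a one-hidden-layer network trained by plain back-propagation to regress a pull-in-voltage-like target from beam parameters. The synthetic data, network size, and learning rate are placeholders, not the paper's model or dataset.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic placeholder data: normalized beam parameters -> pull-in voltage.
X = rng.uniform(size=(200, 3))                    # e.g. length, thickness, gap
y = (1.5 * X[:, 0] - 0.7 * X[:, 1] ** 2 + 0.3 * X[:, 2]
     + 0.05 * rng.normal(size=200))[:, None]

# One-hidden-layer MLP trained by plain back-propagation (gradient descent).
W1, b1 = rng.normal(scale=0.5, size=(3, 16)), np.zeros(16)
W2, b2 = rng.normal(scale=0.5, size=(16, 1)), np.zeros(1)
lr = 0.05

for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)                      # forward pass
    err = (H @ W2 + b2) - y                       # squared-error gradient
    dW2 = H.T @ err / len(X); db2 = err.mean(0)
    dH = err @ W2.T * (1 - H ** 2)                # back-propagate through tanh
    dW1 = X.T @ dH / len(X); db1 = dH.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print("final MSE:", float(np.mean((pred - y) ** 2)))
```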
Internal Error Propagation in Explicit Runge-Kutta Methods
In practical computation with Runge–Kutta methods, the stage equations are not satisfied exactly, due to roundoff errors, algebraic solver errors, and so forth. We show by example that propagation of such errors within a single step can have catastrophic effects for otherwise practical and well-known methods. We perform a general analysis of internal error propagation, emphasizing that it depen...
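A quick way to see the effect described above is to perturb the stage derivatives of a classical explicit Runge-Kutta step and compare against the unperturbed step. The test problem, step size, and noise level below are illustrative assumptions, and perturbing the stage derivatives is only a stand-in for the paper's analysis of inexactly satisfied stage equations.

```python
import numpy as np

rng = np.random.default_rng(4)

def rk4_step(f, t, y, h, stage_noise=0.0):
    """One classical RK4 step; stage_noise adds a small perturbation to each
    stage derivative, mimicking roundoff or inexact stage solves."""
    def perturb(k):
        return k + stage_noise * rng.normal(size=np.shape(k))
    k1 = perturb(f(t, y))
    k2 = perturb(f(t + h / 2, y + h / 2 * k1))
    k3 = perturb(f(t + h / 2, y + h / 2 * k2))
    k4 = perturb(f(t + h, y + h * k3))
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Linear test problem y' = -50 y, y(0) = 1.
f = lambda t, y: -50.0 * y
h = 0.01

y_clean = rk4_step(f, 0.0, 1.0, h)
y_noisy = rk4_step(f, 0.0, 1.0, h, stage_noise=1e-8)
print("effect of 1e-8 stage errors on one step:", abs(y_noisy - y_clean))
```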
Journal:
Volume, Issue:
Pages: -
Publication date: 1991